Multiplicative Updates for NMF with $\beta$-Divergences under Disjoint Equality Constraints

Authors

Abstract

Nonnegative matrix factorization (NMF) is the problem of approximating an input nonnegative matrix, $V$, as the product of two smaller nonnegative matrices, $W$ and $H$. In this paper, we introduce a general framework to design multiplicative updates (MU) for NMF based on $\beta$-divergences ($\beta$-NMF) with disjoint equality constraints, and with penalty terms in the objective function. By disjoint, we mean that each variable appears in at most one equality constraint. Our MU satisfy the set of constraints after each update of the variables during the optimization process, while guaranteeing that the objective function decreases monotonically. We showcase our framework on three models, and show that it competes favorably with the state of the art: (1)~$\beta$-NMF with sum-to-one constraints on the columns of $H$, (2) minimum-volume $\beta$-NMF with sum-to-one constraints on the columns of $W$, and (3) sparse $\beta$-NMF with $\ell_2$-norm constraints on the columns of $W$.
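To make the unconstrained baseline concrete, here is a minimal sketch of the standard multiplicative updates for $\beta$-NMF (the classical Févotte–Idier-style rules, not the constrained variants proposed in this paper). Function name, dimensions, and iteration count are illustrative assumptions; the $\epsilon$ terms guard against division by zero.

```python
import numpy as np

def beta_nmf_mu(V, rank, beta=2.0, n_iter=200, seed=0):
    """Standard multiplicative updates for beta-NMF (illustrative sketch).

    Alternately updates H and W to decrease the beta-divergence
    D_beta(V || WH); each rule multiplies the current factor by a
    nonnegative ratio, so nonnegativity is preserved automatically.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 1e-3   # nonnegative random init
    H = rng.random((rank, n)) + 1e-3
    eps = 1e-12
    for _ in range(n_iter):
        WH = W @ H + eps
        # MU rule for H: elementwise ratio of gradient parts
        H *= (W.T @ (WH ** (beta - 2) * V)) / (W.T @ WH ** (beta - 1) + eps)
        WH = W @ H + eps
        # MU rule for W, symmetric to the H update
        W *= ((WH ** (beta - 2) * V) @ H.T) / (WH ** (beta - 1) @ H.T + eps)
    return W, H
```

Note that these baseline updates do not enforce equality constraints such as sum-to-one columns; the paper's contribution is a framework that builds such constraints (or penalty terms) into the updates while keeping the monotone-decrease guarantee.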


Related articles

Multiplicative Updates for Elastic Net Regularized Convolutional NMF Under $\beta$-Divergence

We generalize the convolutional NMF by taking the β-divergence as the loss function, add a regularizer for sparsity in the form of an elastic net, and provide multiplicative update rules for its factors in closed form. The new update rules embed the β-NMF, the standard convolutional NMF, and sparse coding alias basis pursuit. We demonstrate that the originally published update rules for the con...


A Unified Convergence Analysis of the Multiplicative Update Algorithm for Regularized NMF with General Divergences

The multiplicative update (MU) algorithm has been used extensively to estimate the basis and coefficient matrices in nonnegative matrix factorization (NMF) problems under a wide range of divergences and regularizations. However, theoretical convergence guarantees have only been derived for a few special divergences and without regularizers. We provide a conceptually simple, self-contained, and ...


Multiplicative Updates for Learning with Stochastic Matrices

Stochastic matrices are arrays whose elements are discrete probabilities. They are widely used in techniques such as Markov Chains, probabilistic latent semantic analysis, etc. In such learning problems, the learned matrices, being stochastic matrices, are non-negative and all or part of the elements sum up to one. Conventional multiplicative updates which have been widely used for nonnegative ...


Clustering with Beta Divergences

Clustering algorithms start with a fixed divergence, which captures the possibly asymmetric distance between a sample and a centroid. In the mixture model setting, the sample distribution plays the same role. When all attributes have the same topology and dispersion, the data are said to be homogeneous. If the prior knowledge of the distribution is inaccurate or the set of plausible distributio...


Multiplicative updates For Non-Negative Kernel SVM

We present multiplicative updates for solving hard and soft margin support vector machines (SVM) with non-negative kernels. They follow as a natural extension of the updates for non-negative matrix factorization. No additional parameter setting, such as choosing a learning rate, is required. Experiments demonstrate rapid convergence to good classifiers. We analyze the rates of asymptotic converge...



Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2021

ISSN: 1095-7162, 0895-4798

DOI: https://doi.org/10.1137/20m1377278